# Whole word masking
## Bert Base Swedish Cased
Swedish BERT base model released by the National Library of Sweden/KBLab, trained on multi-source texts.

- Category: Large Language Model, Other
- Publisher: KB
- Stats: 11.16k · 21
## Bert Base Swedish Cased Ner
Swedish BERT base model released by the National Library of Sweden/KBLab, trained on multi-source text data.

- Category: Large Language Model, Other
- Publisher: KBLab
- Stats: 245 · 5
## Umberto Wikipedia Uncased V1
UmBERTo is an Italian language model based on the RoBERTa architecture, trained with SentencePiece tokenization and whole word masking, suitable for a range of natural language processing tasks.

- Category: Large Language Model, Transformers, Other
- Publisher: Musixmatch
- Stats: 1,079 · 7
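The models listed here are trained with whole word masking: when the tokenizer splits a word into several subword pieces, either all of the pieces are masked or none of them are, so the model must predict the entire word from context. A minimal sketch of the idea, assuming WordPiece-style `##` continuation markers (this is an illustrative toy, not any specific library's implementation):

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=None):
    """Mask whole words: a word split into WordPiece subwords
    ('##'-prefixed continuation pieces) is masked all-or-nothing."""
    rng = random.Random(seed)
    # Group token indices into words: a new word starts at any token
    # that is not a '##' continuation piece.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for word in words:
        # Decide once per word, then apply to every piece of that word.
        if rng.random() < mask_prob:
            for i in word:
                masked[i] = mask_token
    return masked

tokens = ["um", "##bert", "##o", "likes", "mask", "##ing"]
print(whole_word_mask(tokens, mask_prob=1.0))
# → ['[MASK]', '[MASK]', '[MASK]', '[MASK]', '[MASK]', '[MASK]']
```

The contrast with plain token-level masking is that `mask ##ing` can never end up half-masked: the masking decision is made per word, which makes the prediction task harder and tends to improve downstream quality for BERT-style models.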